On Iterative Krylov-Dogleg Trust-Region Steps for Solving Neural Networks Nonlinear Least Squares Problems
Authors
Abstract
This paper describes a method of dogleg trust-region steps, or restricted Levenberg-Marquardt steps, based on a projection process onto Krylov subspaces for nonlinear least-squares problems arising in neural-network training. In particular, the linear conjugate gradient (CG) method works as the inner iterative algorithm for solving the linearized Gauss-Newton normal equation, whereas the outer nonlinear algorithm repeatedly takes so-called "Krylov-dogleg" steps, relying only on matrix-vector products without explicitly forming the Jacobian matrix or the Gauss-Newton model Hessian. That is, our iterative dogleg algorithm can reduce both the operation count and the memory requirement by a factor of O(n), where n is the number of parameters, in comparison with a direct linear-equation solver. This matrix-free, memory-saving property is useful for large-scale problems.
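As a concrete illustration of the inner iteration, the following is a minimal sketch (not the authors' implementation) of a matrix-free linear CG solve of the Gauss-Newton normal equation JᵀJ p = -Jᵀr, where the products Jv and Jᵀv are obtained by forward- and reverse-mode automatic differentiation. The function residual_fn, the flat parameter vector params, and the iteration limits are illustrative assumptions.

```python
import jax
import jax.numpy as jnp


def gauss_newton_matvec(residual_fn, params, v):
    """Compute (J^T J) v without forming J: one jvp for J v, one vjp for J^T (J v)."""
    _, jv = jax.jvp(residual_fn, (params,), (v,))
    _, vjp_fn = jax.vjp(residual_fn, params)
    (jtjv,) = vjp_fn(jv)
    return jtjv


def krylov_cg_step(residual_fn, params, max_iters=20, tol=1e-6):
    """Approximately solve J^T J p = -J^T r with linear CG (the inner iteration)."""
    r_vec = residual_fn(params)
    _, vjp_fn = jax.vjp(residual_fn, params)
    (grad,) = vjp_fn(r_vec)                    # J^T r, the gradient of 0.5 * ||r||^2

    p = jnp.zeros_like(params)
    resid = -grad                              # residual of the normal equation
    d = resid
    rs_old = jnp.vdot(resid, resid)
    for _ in range(max_iters):
        Ad = gauss_newton_matvec(residual_fn, params, d)
        alpha = rs_old / jnp.vdot(d, Ad)
        p = p + alpha * d
        resid = resid - alpha * Ad
        rs_new = jnp.vdot(resid, resid)
        if jnp.sqrt(rs_new) < tol:
            break
        d = resid + (rs_new / rs_old) * d
        rs_old = rs_new
    return p                                   # approximate Gauss-Newton step
```

The sketch covers only the unconstrained inner solve; in the paper's outer iteration the successive CG iterates are combined with the trust-region radius to form the Krylov-dogleg step.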
Similar Papers
Jacobian-Free Three-Level Trust Region Method for Nonlinear Least Squares Problems
Nonlinear least squares (NLS) problems arise in many applications. Common solvers require computing and storing the corresponding Jacobian matrix explicitly, which is too expensive for large problems. In this paper, we propose an effective Jacobian-free method, especially suited to large NLS problems, based on the novel combination of using automatic differentiation for J(x)v and Jᵀ(x)v along with...
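The combination mentioned here, computing J(x)v and Jᵀ(x)v by automatic differentiation without ever materializing J, can be written down compactly; the snippet below is a generic illustration (the toy residual and function names are assumptions, not taken from the cited paper).

```python
import jax
import jax.numpy as jnp


def jac_vec(residual_fn, x, v):
    """Forward-mode AD gives J(x) v."""
    return jax.jvp(residual_fn, (x,), (v,))[1]


def jac_t_vec(residual_fn, x, w):
    """Reverse-mode AD gives J(x)^T w."""
    return jax.vjp(residual_fn, x)[1](w)[0]


# Toy residual r(x) = (x0^2 - 1, x0 * x1); its Jacobian at x = (2, 3) is [[4, 0], [3, 2]].
r = lambda x: jnp.array([x[0] ** 2 - 1.0, x[0] * x[1]])
x0 = jnp.array([2.0, 3.0])
print(jac_vec(r, x0, jnp.array([1.0, 0.0])))    # [4. 3.]  (first column of J)
print(jac_t_vec(r, x0, jnp.array([1.0, 0.0])))  # [4. 0.]  (first row of J)
```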
Iterative Scaled Trust-Region Learning in Krylov Subspaces via Pearlmutter's Implicit Sparse Hessian-Vector Multiply
The online incremental gradient (or backpropagation) algorithm is widely considered to be the fastest method for solving large-scale neural-network (NN) learning problems. In contrast, we show that an appropriately implemented iterative batch-mode (or block-mode) learning method can be much faster. For example, it is three times faster in the UCI letter classification problem (26 outputs, 16,00...
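The implicit Hessian-vector multiply named in this work's title (Pearlmutter's trick) can be realized with nested automatic differentiation; the sketch below is a generic forward-over-reverse version, with a toy loss chosen only for illustration.

```python
import jax
import jax.numpy as jnp


def hvp(loss_fn, params, v):
    """Hessian-vector product H(params) v without forming the Hessian (forward-over-reverse)."""
    return jax.jvp(jax.grad(loss_fn), (params,), (v,))[1]


# Toy loss: sum(w^4) + ||w||^2, whose Hessian is diag(12 w^2 + 2).
loss = lambda w: jnp.sum(w ** 4) + jnp.dot(w, w)
w = jnp.array([1.0, -2.0, 0.5])
print(hvp(loss, w, jnp.ones_like(w)))  # [14. 50.  5.]
```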
Newton-Krylov Type Algorithm for Solving Nonlinear Least Squares Problems
The minimization of a quadratic function within an ellipsoidal trust region is an important subproblem for many nonlinear programming algorithms. When the number of variables is large, one of the most widely used strategies is to project the original problem onto a low-dimensional subspace. In this paper, we introduce an algorithm for solving nonlinear least squares problems. This algorithm i...
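The subspace strategy described here can be summarized as: build an orthonormal basis V of a low-dimensional subspace, restrict the quadratic model to span(V), and solve the resulting small problem. The sketch below uses a simple scaling back to the trust-region boundary instead of an exact small-subproblem solve; B, g, V, and delta are assumed inputs, not the cited algorithm's interface.

```python
import jax.numpy as jnp


def subspace_trust_region_step(B, g, V, delta):
    """Approximate step p = V q restricted to the columns of the orthonormal basis V."""
    Bk = V.T @ B @ V               # k-by-k reduced model Hessian
    gk = V.T @ g                   # reduced gradient
    q = jnp.linalg.solve(Bk, -gk)  # unconstrained minimizer within the subspace
    p = V @ q
    norm_p = jnp.linalg.norm(p)
    # Simple scaling back to the trust-region boundary (not an exact subproblem solve).
    return jnp.where(norm_p <= delta, p, (delta / norm_p) * p)
```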
Inexact Newton Dogleg Methods
The dogleg method is a classical trust-region technique for globalizing Newton's method. While it is widely used in optimization, including large-scale optimization via truncated-Newton approaches, its implementation in general inexact Newton methods for systems of nonlinear equations can be problematic. In this paper, we first outline a very general dogleg method suitable for the general inexac...
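For reference, the classical dogleg step this work builds on chooses a point on the piecewise-linear path from the origin through the Cauchy point to the (possibly inexact) Newton point, clipped at the trust-region radius. The sketch below assumes an explicit model Hessian B and a precomputed Newton step; it is an illustration, not the inexact Newton dogleg method of the cited paper.

```python
import jax.numpy as jnp


def dogleg_step(g, B, p_newton, delta):
    """Classical dogleg: 0 -> Cauchy point -> Newton point, clipped at radius delta."""
    # Cauchy point: minimizer of the quadratic model along the steepest-descent direction.
    p_cauchy = -(jnp.dot(g, g) / jnp.dot(g, B @ g)) * g
    if jnp.linalg.norm(p_newton) <= delta:
        return p_newton                                        # full Newton step fits
    if jnp.linalg.norm(p_cauchy) >= delta:
        return (delta / jnp.linalg.norm(p_cauchy)) * p_cauchy  # scaled steepest descent
    # Otherwise find tau in [0, 1] with ||p_cauchy + tau * (p_newton - p_cauchy)|| = delta.
    d = p_newton - p_cauchy
    a = jnp.dot(d, d)
    b = 2.0 * jnp.dot(p_cauchy, d)
    c = jnp.dot(p_cauchy, p_cauchy) - delta ** 2
    tau = (-b + jnp.sqrt(b ** 2 - 4.0 * a * c)) / (2.0 * a)
    return p_cauchy + tau * d
```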
Assessments of Nonlinear Least Squares Methods for UAV Vision Based Navigation
In recent years, UAVs have increasingly become a main part of both military and commercial operations, where accurate pose estimation is a critical problem. The UAV pose-estimation problem can be investigated through a vision-based navigation (VBN) approach, in which visual sensors augment the traditional IMU. VBN is based on localizing a set of features...